Quadratic Mutual Information Feature Selection

Authors

  • Davor Sluga
  • Uros Lotric
Abstract

We propose a novel feature selection method based on quadratic mutual information, which has its roots in the Cauchy–Schwarz divergence and Rényi entropy. The method estimates quadratic mutual information directly from data samples using Gaussian kernel functions and can detect second-order non-linear relations. Its main advantages are: (i) a unified analysis of discrete and continuous data, requiring no discretization; and (ii) its parameter-free design. The effectiveness of the proposed method is demonstrated through an extensive comparison with mutual information feature selection (MIFS), minimum redundancy maximum relevance (MRMR), and joint mutual information (JMI) on classification and regression problem domains. The experiments show that the proposed method performs comparably to the other methods on classification problems while being considerably faster; on regression problems it compares favourably to the others but is slower.
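For readers unfamiliar with the quantity, the Cauchy–Schwarz divergence between the joint density and the product of the marginals yields the quadratic mutual information referred to in the abstract. In standard information-theoretic-learning notation,

$$
I_{CS}(X;Y) = \log\frac{V_J\,V_M}{V_c^{2}},\qquad
V_J = \iint p(x,y)^2\,dx\,dy,\quad
V_M = \iint \bigl(p(x)\,p(y)\bigr)^2\,dx\,dy,\quad
V_c = \iint p(x,y)\,p(x)\,p(y)\,dx\,dy.
$$

Below is a minimal sketch of the plug-in estimator for two one-dimensional variables, using the fact that two Gaussian Parzen kernels of width sigma convolve into one of variance 2*sigma^2. It illustrates the general technique only; the function names, the fixed bandwidth, and the standardization step are our assumptions, not the paper's parameter-free design.

```python
import numpy as np

def gaussian_gram(z, sigma):
    # Pairwise evaluations of the convolved kernel G_{sigma*sqrt(2)}(z_i - z_j).
    diff = z[:, None] - z[None, :]
    var = 2.0 * sigma ** 2
    return np.exp(-diff ** 2 / (2.0 * var)) / np.sqrt(2.0 * np.pi * var)

def qmi_cs(x, y, sigma=1.0):
    # Cauchy-Schwarz quadratic mutual information I_CS(X;Y) from samples
    # (illustrative sketch; the bandwidth choice is an assumption).
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    Kx, Ky = gaussian_gram(x, sigma), gaussian_gram(y, sigma)
    v_joint = np.mean(Kx * Ky)                            # V_J
    v_marginal = np.mean(Kx) * np.mean(Ky)                # V_M
    v_cross = np.mean(Kx.mean(axis=1) * Ky.mean(axis=1))  # V_c
    return np.log(v_joint * v_marginal / v_cross ** 2)

# A non-linear (quadratic) dependence scores well above independent noise:
rng = np.random.default_rng(0)
x = rng.normal(size=500)
print(qmi_cs(x, x ** 2 + 0.1 * rng.normal(size=500)))  # clearly positive
print(qmi_cs(x, rng.normal(size=500)))                 # close to zero
```

By the Cauchy–Schwarz inequality the quantity is non-negative and equals zero only under independence, so larger values mean stronger dependence, which is what a feature selector ranks by.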


Similar articles

An improvement direction for filter selection techniques using information theory measures and quadratic optimization

Filter selection techniques are known for their simplicity and efficiency. However, this kind of method does not take inter-feature redundancy into consideration. Consequently, the redundant features that are not removed remain in the final classification model, lowering its generalization performance. In this paper we propose to use a mathematical optimization method that reduces inter-feature redun...
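The teaser is truncated, but methods in this family typically cast redundancy reduction as a quadratic program over non-negative feature weights. A representative formulation (along the lines of quadratic programming feature selection; whether the cited paper uses exactly this form is an assumption):

$$
\min_{\mathbf{x}\ge 0,\;\sum_i x_i = 1}\;(1-\alpha)\,\mathbf{x}^{\top}Q\,\mathbf{x}\;-\;\alpha\,\mathbf{f}^{\top}\mathbf{x},
$$

where Q_ij quantifies the redundancy between features i and j (e.g., their mutual information), f_i quantifies the relevance of feature i to the target, and alpha trades the two off; features are then ranked by the optimal weights.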


Feature Selection with Non-Parametric Mutual Information for Adaboost Learning

This paper describes a feature selection method based on quadratic mutual information. We describe the formulation needed to estimate the mutual information from the data. This paper is motivated by the high time cost of the training process when using classical boosting algorithms. This method allows reusing part of the training time spent in the first training process to speed up posterio...


Feature Extraction by Non-Parametric Mutual Information Maximization

We present a method for learning discriminative feature transforms using the mutual information between class labels and transformed features as the criterion. Instead of the commonly used mutual information measure based on the Kullback-Leibler divergence, we use a quadratic divergence measure, which allows an efficient non-parametric implementation and requires no prior assumptions about cla...
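In the information-theoretic-learning literature, the quadratic divergence referred to here is usually the Euclidean distance between the joint density and the product of the marginals,

$$
I_{ED}(X;Y) = \iint \bigl(p(x,y) - p(x)\,p(y)\bigr)^2\,dx\,dy = V_J - 2V_c + V_M,
$$

with the same three information potentials as above, so each term again reduces to sums of pairwise Gaussian kernel evaluations and admits a non-parametric plug-in estimate. That this is the exact measure used in the cited paper is our reading of the truncated abstract, not a quote.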


Feature Selection Using Multi Objective Genetic Algorithm with Support Vector Machine

Different approaches have been proposed for feature selection to obtain a suitable feature subset from among all features. These methods search the feature space for feature subsets that satisfy some criteria or optimize several objective functions. The objective functions are divided into two main groups: filter and wrapper methods. In filter methods, feature subsets are selected according to some measu...


Feature Evaluation using Quadratic Mutual Information

Methods of feature evaluation are developed and discussed based on information-theoretic learning (ITL). Mutual information has been shown in the literature to be a robust and precise measure for evaluating a feature set. In this paper, we propose to use quadratic mutual information (QMI) for feature evaluation. The concept of information potential gives a clearer physical meaning to the evaluatio...
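For context, the information potential is the argument of Rényi's quadratic entropy; a Parzen estimate with Gaussian kernels turns it into a sum of pairwise interactions between samples, which is the source of the physical analogy:

$$
H_2(X) = -\log\int p(x)^2\,dx = -\log V(X),\qquad
\hat{V}(X) = \frac{1}{N^2}\sum_{i=1}^{N}\sum_{j=1}^{N} G_{\sigma\sqrt{2}}(x_i - x_j),
$$

so samples behave like particles whose pairwise "forces" are kernel evaluations, and the QMI potentials $V_J$, $V_M$, $V_c$ above are the corresponding joint, marginal, and cross interactions.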



Journal:
  • Entropy

Volume 19, Issue -

Pages -

Publication date: 2017